18 results for CHD Prediction, Blood Serum Data Chemometrics Methods

in Helda - Digital Repository of University of Helsinki


Relevance: 100.00%

Abstract:

Cell transition data is obtained from a cellular phone that switches its current serving cell tower. The data consists of a sequence of transition events, which are pairs of cell identifiers and transition times. The focus of this thesis is applying data mining methods to such data, developing new algorithms, and extracting knowledge that will serve as a solid foundation on which to build location-aware applications. In addition to a thorough exploration of the features of the data, the tools and methods developed in this thesis provide solutions to three distinct research problems. First, we develop clustering algorithms that produce a reliable mapping between cell transitions and physical locations observed by users of mobile devices. The main clustering algorithm operates in an online fashion, and we also consider a number of offline clustering methods for comparison. Second, we define the concept of significant locations, known as bases, and give an online algorithm for determining them. Finally, we consider the task of predicting the movement of the user based on historical data. We develop a prediction algorithm that considers paths of movement in their entirety, instead of just the most recent movement history. All of the presented methods are evaluated with a significant body of real cell transition data, collected from about one hundred different individuals. The algorithms developed in this thesis are designed to be implemented on a mobile device, and require no extra hardware sensors or network infrastructure. By not relying on external services and keeping the user information as much as possible on the user's own personal device, we avoid privacy issues and let the users control the disclosure of their location information.
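As a concrete illustration of path-based movement prediction, the sketch below stores observed cell-transition paths and predicts the next cell from the longest matching suffix of the recent history. The thesis's algorithm considers paths of movement in their entirety; this back-off predictor, and every identifier in it, is a simplified assumption, not the published method.

```python
from collections import defaultdict, Counter

class PathPredictor:
    """Toy next-cell predictor that matches the longest recorded
    suffix of the recent transition history (an illustration of
    path-based prediction, not the thesis's exact algorithm)."""

    def __init__(self, max_order=4):
        self.max_order = max_order
        # maps a tuple of recent cell ids -> counts of the observed next cell
        self.counts = defaultdict(Counter)

    def observe(self, history, next_cell):
        for k in range(1, min(self.max_order, len(history)) + 1):
            self.counts[tuple(history[-k:])][next_cell] += 1

    def predict(self, history):
        # back off from the longest known suffix to the shortest
        for k in range(min(self.max_order, len(history)), 0, -1):
            suffix = tuple(history[-k:])
            if suffix in self.counts:
                return self.counts[suffix].most_common(1)[0][0]
        return None

# usage: train on a sequence of cell ids, then predict the next cell
p = PathPredictor()
path = ["A", "B", "C", "D", "B", "C", "E"]
for i in range(1, len(path)):
    p.observe(path[:i], path[i])
print(p.predict(["B", "C"]))  # "D" or "E", depending on counts
```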

Relevance: 100.00%

Abstract:

Numerical weather prediction (NWP) models provide the basis for weather forecasting by simulating the evolution of the atmospheric state. A good forecast requires that the initial state of the atmosphere is known accurately, and that the NWP model is a realistic representation of the atmosphere. Data assimilation methods are used to produce initial conditions for NWP models. The NWP model background field, typically a short-range forecast, is updated with observations in a statistically optimal way. The objective of this thesis has been to develop methods that allow data assimilation of Doppler radar radial wind observations. The work has been carried out in the High Resolution Limited Area Model (HIRLAM) 3-dimensional variational data assimilation framework. Observation modelling is a key element in exploiting indirect observations of the model variables. In radar radial wind observation modelling, the vertical model wind profile is interpolated to the observation location, and the projection of the model wind vector on the radar pulse path is calculated. The vertical broadening of the radar pulse volume and the bending of the radar pulse path due to atmospheric conditions are taken into account. Radar radial wind observations are modelled within observation errors, which consist of instrumental, modelling, and representativeness errors. Systematic and random modelling errors can be minimized by accurate observation modelling. The impact of the random part of the instrumental and representativeness errors can be decreased by calculating spatial averages from the raw observations. Model experiments indicate that the spatial averaging clearly improves the fit of the radial wind observations to the model in terms of observation minus model background (OmB) standard deviation. Monitoring the quality of the observations is an important aspect, especially when a new observation type is introduced into a data assimilation system. Calculating the bias for radial wind observations in the conventional way can result in zero even when there are systematic differences in the wind speed and/or direction. A bias estimation method designed for this observation type is introduced in the thesis. Doppler radar radial wind observation modelling, together with the bias estimation method, enables the exploitation of radial wind observations also for NWP model validation. One-month model experiments performed with HIRLAM model versions differing only in a surface stress parameterization detail indicate that the use of radar wind observations in NWP model validation is very beneficial.
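The projection of the model wind vector onto the radar beam can be illustrated with the standard beam geometry (azimuth measured clockwise from north, elevation from the horizontal). The sketch below is a minimal observation operator under that assumption; it deliberately omits the pulse-volume broadening and beam bending that the thesis accounts for.

```python
import math

def radial_wind(u, v, azimuth_deg, elevation_deg, w=0.0):
    """Project a model wind vector (u eastward, v northward, w upward,
    all in m/s) onto the radar beam direction. A simplified observation
    operator: beam broadening and atmospheric bending are ignored."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (u * math.sin(az) + v * math.cos(az)) * math.cos(el) + w * math.sin(el)

# a 10 m/s westerly (u = 10) seen by a beam pointing due east at 0.5 deg elevation
print(radial_wind(10.0, 0.0, azimuth_deg=90.0, elevation_deg=0.5))  # ~10 m/s, away from radar
```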

Relevance: 100.00%

Abstract:

Infection by Epstein-Barr virus (EBV) occurs in approximately 95% of the world's population. EBV was the first human virus implicated in oncogenesis. Characteristic of primary EBV infection are detectable IgM and IgG antibodies against viral capsid antigen (VCA). During convalescence the VCA IgM disappears while the VCA IgG persists for life. Reactivations of EBV occur both among immunocompromised and immunocompetent individuals. In serological diagnosis, measurement of the avidity of VCA IgG separates primary from secondary infections. However, in the serodiagnosis of mononucleosis it is quite common to encounter, paradoxically, VCA IgM together with high-avidity VCA IgG, indicating past immunity. We determined the etiology of this phenomenon and found that among patients with primary cytomegalovirus (CMV) infection a large proportion (23%) showed antibody profiles of EBV reactivation. In contrast, primary EBV infection did not appear to induce immunoreactivation of CMV. EBV-associated post-transplant lymphoproliferative disease (PTLD) is a life-threatening complication of allogeneic stem cell or solid organ transplantation. PTLD may present with a diverse spectrum of clinical symptoms and signs. Due to the rapidity of PTLD progression, especially after stem cell transplantation, the diagnosis must be obtained quickly. With timely detection, the evolution of the fatal disease may be halted by reduction of immunosuppression. A promising new PTLD treatment (also in Finland) is based on anti-CD20 monoclonal antibodies. Diagnosis of PTLD has been demanding because of immunosuppression, blood transfusions and the latent nature of the virus. In 1999 we set up, to our knowledge the first in Finland for any microbial pathogen, a real-time quantitative PCR (qPCR) assay for detection of EBV DNA in blood serum/plasma. In addition, we set up an in situ hybridisation assay for EBV RNA in tissue sections. In collaboration with a group of haematologists at Helsinki University Central Hospital we retrospectively determined the incidence of PTLD among 257 allogeneic stem cell transplantations (SCT) performed during 1994-1999. Post-mortem analysis revealed 18 cases of PTLD. From a subset of PTLD cases (12/18) and a series of corresponding controls (36), consecutive samples of serum were studied by the new EBV qPCR. All the PTLD patients were positive for EBV DNA with progressively rising copy numbers. In most PTLD patients EBV DNA became detectable within 70 days of SCT. Of note, the appearance of EBV DNA preceded the PTLD symptoms (fever, lymphadenopathy, atypical lymphocytes). Among the SCT controls, EBV DNA occurred only sporadically, and the EBV-DNA levels remained relatively low. We concluded that EBV qPCR is a highly sensitive (100%) and specific (96%) new diagnostic approach. We also looked for and found risk factors for the development of PTLD. Together with a liver transplantation group at the Transplantation and Liver Surgery Clinic we wanted to clarify how often and how severely EBV infections occur after liver transplantation. We studied by EBV qPCR 1284 plasma samples obtained from 105 adult liver transplant recipients. EBV DNA was detected in 14 patients (13%) during the first 12 months. The peak viral loads of the 13 asymptomatic patients were relatively low (<6600/ml), and EBV DNA subsided quickly from circulation. Fatal PTLD was diagnosed in one patient.
Finally, we wanted to determine the number and clinical significance of EBV infections of various types occurring in a large, retrospective, nonselected cohort of allogeneic SCT recipients. We analysed by EBV qPCR 5479 serum samples of 406 SCT recipients obtained during 1988-1999. EBV DNA was seen in 57 (14%) patients, of whom 22 (5%) showed progressively rising and ultimately high levels of EBV DNA (median 54 million/ml). Among the SCT survivors, EBV DNA was transiently detectable in 19 (5%) asymptomatic patients. Thus, low-level EBV-DNA positivity in serum occurs relatively often after SCT and may subside without specific treatment. However, high molecular copy numbers (>50 000) are diagnostic for life-threatening EBV infection. We furthermore developed a mathematical algorithm for the prediction of the development of life-threatening EBV infection.
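A minimal sketch of threshold-based risk flagging on serial qPCR values is shown below. The >50 000 copies figure comes from the abstract; the rising-trend rule and all names are illustrative assumptions, not the published prediction algorithm.

```python
def ebv_risk_flag(copies_per_ml, threshold=50_000, rising_points=3):
    """Flag a serial serum EBV-DNA qPCR series as high risk if it exceeds
    a diagnostic threshold or rises monotonically over consecutive samples.
    The threshold is from the abstract (>50 000 copies); the trend rule is
    an illustrative assumption, not the thesis's algorithm."""
    if any(c > threshold for c in copies_per_ml):
        return True
    recent = copies_per_ml[-rising_points:]
    return len(recent) == rising_points and all(
        a < b for a, b in zip(recent, recent[1:]))

print(ebv_risk_flag([0, 800, 12_000, 95_000]))  # True: exceeds threshold
print(ebv_risk_flag([0, 500, 2_000, 9_000]))    # True: progressively rising
print(ebv_risk_flag([0, 1_200, 0, 600]))        # False: sporadic low level
```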

Relevance: 100.00%

Abstract:

In recent decades there has been a global shift in forest management from a focus solely on timber management to ecosystem management that endorses all aspects of forest functions: ecological, economic and social. This has resulted in a paradigm shift from sustained yield to the sustained diversity of values, goods and benefits obtained at the same time, introducing new temporal and spatial scales into forest resource management. The purpose of the present dissertation was to develop methods that would enable spatial and temporal scales to be introduced into the storage, processing, access and utilization of forest resource data. The methods developed are based on a conceptual view of a forest as a hierarchically nested collection of objects that can have a dynamically changing set of attributes. The temporal aspect of the methods consists of lifetime management for the objects and their attributes, and of a temporal succession linking the objects together. Development of the forest resource data processing method concentrated on the extensibility and configurability of the data content and model calculations, allowing a diverse set of processing operations to be executed using the same framework. The contribution of this dissertation to the utilisation of multi-scale forest resource data lies in the development of a reference data generation method to support forest inventory methods in approaching single-tree resolution.
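The object view described above (nested objects, attributes with lifetimes, succession links) could be sketched along the following lines; every class and field name here is an illustrative assumption, not the dissertation's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    value: float
    valid_from: int              # e.g. year the value became valid
    valid_to: int | None = None  # None = still valid

@dataclass
class ForestObject:
    """One node in a hierarchically nested forest model (e.g. estate ->
    stand -> tree stratum -> tree); a sketch of the conceptual view
    described above with assumed, illustrative names."""
    obj_id: str
    level: str
    attributes: list[Attribute] = field(default_factory=list)
    children: list["ForestObject"] = field(default_factory=list)
    successor: "ForestObject | None" = None  # temporal succession link

    def value_at(self, name: str, year: int):
        # lifetime management: return the attribute value valid in a given year
        for a in self.attributes:
            if a.name == name and a.valid_from <= year and (
                    a.valid_to is None or year < a.valid_to):
                return a.value
        return None

stand = ForestObject("S1", "stand")
stand.attributes.append(Attribute("volume_m3_ha", 180.0, valid_from=2004, valid_to=2007))
stand.attributes.append(Attribute("volume_m3_ha", 205.0, valid_from=2007))
print(stand.value_at("volume_m3_ha", 2006))  # 180.0
```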

Relevance: 100.00%

Abstract:

Telecommunications network management is based on huge amounts of data that are continuously collected from elements and devices all around the network. The data are monitored and analysed to provide information for decision making in all operation functions. Knowledge discovery and data mining methods can support fast-paced decision making in network operations. In this thesis, I analyse decision making on different levels of network operations. I identify the requirements that decision making sets for knowledge discovery and data mining tools and methods, and I study the resources that are available to them. I then propose two methods for augmenting and applying frequent sets to support everyday decision making. The proposed methods are Comprehensive Log Compression for log data summarisation and Queryable Log Compression for semantic compression of log data. Finally, I suggest a model for a continuous knowledge discovery process and outline how it can be implemented and integrated into the existing network operations infrastructure.
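To make the frequent-set idea concrete, the sketch below counts frequent field-value combinations across log events and keeps those above a support threshold, the raw material that summarisation methods in the spirit of Comprehensive Log Compression work from. It is a toy illustration under assumed event formats, not the published algorithm.

```python
from collections import Counter
from itertools import combinations

def frequent_sets(log_events, min_support=2, max_size=3):
    """Count frequent value combinations in log events (each event given
    as a set of field=value strings) and keep those meeting min_support.
    A minimal illustration of summarising logs with frequent sets."""
    counts = Counter()
    for event in log_events:
        items = sorted(event)
        for size in range(1, max_size + 1):
            for combo in combinations(items, size):
                counts[combo] += 1
    return {c: n for c, n in counts.items() if n >= min_support}

events = [
    {"host=bsc01", "alarm=link_down", "sev=major"},
    {"host=bsc01", "alarm=link_down", "sev=minor"},
    {"host=rnc07", "alarm=cpu_load", "sev=major"},
]
for pattern, n in sorted(frequent_sets(events).items(), key=lambda kv: -kv[1]):
    print(n, pattern)  # e.g. 2 ('alarm=link_down', 'host=bsc01')
```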

Relevance: 100.00%

Abstract:

In meteorology, observations and forecasts of a wide range of phenomena (for example, snow, clouds, hail, fog, and tornadoes) can be categorical, that is, they can only take discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, and snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that we did not know the true snow extent, and we were forced simply to measure the agreement between different products. Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between the products. The trustworthiness of the results for cloud analyses [the EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)] compared with ceilometers of the Helsinki Testbed was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example, the accuracy and timeliness of the particular data and methods. In this vein, we discuss tentatively how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that they are of reasonable quality, and their use for case studies can be warmly recommended. Last, the use of cluster analysis on meteorological in-situ measurements was explored. The Autoclass algorithm was used to construct compact representations of the synoptic conditions of fog at Finnish airports.
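A percentile-bootstrap confidence interval for the agreement rate between two categorical series can be sketched as below. This version assumes independent resampling; the thesis's contribution includes handling spatial and temporal correlation, which is omitted here, and the data are invented.

```python
import random

def bootstrap_ci(sat, ref, n_boot=10_000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for the agreement rate between two
    categorical series (e.g. a satellite cloud mask vs. ceilometer
    observations). Plain i.i.d. resampling only."""
    rng = random.Random(seed)
    n = len(sat)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(sum(sat[i] == ref[i] for i in idx) / n)
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

sat = ["cloud", "cloud", "clear", "cloud", "clear", "clear", "cloud", "clear"]
ref = ["cloud", "clear", "clear", "cloud", "clear", "cloud", "cloud", "clear"]
print(bootstrap_ci(sat, ref))  # wide interval for this tiny sample
```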

Relevance: 100.00%

Abstract:

Increased mass migration, as a result of economic hardship, natural disasters and wars, forces many people to arrive on the shores of cultures very different from those they left. How do they manage the legacy of the past and the challenges of their new everyday life? This is a study of immigrant women living in transnational families that act and communicate across national borders on a near-daily basis. The research was carried out amongst immigrant women who were currently living in Finland. The research asks how transnational everyday life is constructed. As everyday life, due to its mundane nature, is difficult to operationalise for research purposes, mixed data collection methods were needed to capture the passing moments that easily become invisible. Thus, the data were obtained from photographic diaries (459 photographs) taken by the research participants themselves. Additionally, stimulated recall discussions, structured questionnaires and participant observation notes were used to complement the photographic data. A tool for analysing the activities revealed in the data was created on the assumption that a family is an active unit that accommodates the current situation in which it is embedded. Everyday life activities were analysed emphasizing social, modal and spatial dimensions. Important daily moments were placed on a continuum: "for me", "for immediate others" and "with immediate others". They portrayed everyday routines and exceptions to them. The data matrix was developed as part of this study. The spatial dimensions formed seven units of activity settings: space for friendship, food, resting, childhood, caring, space to learn and an orderly space. Attention was also paid to the accommodative nature of activities: how women maintain traditions and adapt to Finnish life or re-create new activity patterns. The women's narrations revealed the importance of everyday life. The transnational chain of women across generations and countries, comprised of daughters, mothers and grandmothers, was important. The women showed the need for information technology in their transnational lives. They had an active relationship to religion; the denial or importance of it was obvious. Arranging one's life in Finnish society was also central to their narrations. The analysis exposed everyday activities, showed the importance of social networks and the uniqueness of each woman and family. It revealed everyday life in a structured way. The method of analysis that evolved in this study, together with the research findings, is of potential use to professionals, allowing the targeting of interventions to improve the everyday lives of immigrants.

Relevance: 100.00%

Abstract:

Aims. The main meals that youngsters have during the day are eaten at home and at school. In the Nordic countries breakfast and supper are often eaten with other members of the family. The ways that the Nordic countries arrange the school lunch and the frequency of family meals differ between countries. However, the challenges related to the eating habits of the young are surprisingly similar. The aim of this study is to discuss how the Nordic countries could support youngsters' healthy eating habits. This study was carried out as part of a Nordic research project and it complemented the work done by Kauppinen (2009) and Niemi (2009) in their Master's Theses. The research questions are: 1. How do the youngsters evaluate their own eating habits and those of their family? 2. How do the youngsters evaluate the influence of home, family and school on their own eating habits? 3. What kind of relationship exists between eating at home and at school according to the data? Data and methods. A quantitative internet-based survey was used to collect data (N=1539) on 9th graders' conceptions and understandings. The survey consisted of respondents from Finland (N=586), Sweden (N=427), Denmark (N=295) and Norway (N=246). In this study the whole data set was analyzed to the appropriate extent. The analysis was done with the SPSS software and included examination of means, standard deviations, cross-tabulations, Pearson's correlations, chi-squared tests, t-tests and one-way analysis of variance (ANOVA). The results were compared between the countries and between the sexes. Results and discussion. The studied youngsters evaluated their own eating habits positively. There were statistically significant differences (p < .05) between countries concerning the people who influence the youngsters' healthy eating habits. Youngsters from Finland and Sweden considered making healthy choices at school easier than those from Denmark and Norway. Eating a so-called healthy lunch at school was also more common in Finland and Sweden. Eating breakfast and eating a healthy meal at school had a statistically significant interconnection (p < .001). The differences between the sexes were not equal across the countries. The results supported those of previous studies, but also raised ideas for further study. Youngsters' near environments should support their possibilities to make healthy choices and to participate in the decision-making process. Co-operation between the Nordic countries and between the home and the school is important. Listening to the youngsters' own voice is a challenge and a possibility for developing both home economics education and research in this area. Key words: Nordic countries, youngsters, healthy eating habits, eating at home, school meals
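The reported interconnection between eating breakfast and eating a healthy school lunch is the kind of result a chi-squared test on a contingency table yields. A sketch with invented counts (the study's own cell frequencies are not given in the abstract):

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 counts: rows = eats breakfast (yes/no),
# columns = eats a healthy school lunch (yes/no). All figures are
# invented for illustration; the study reported the interconnection
# as significant at p < .001.
table = [[620, 180],
         [310, 240]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```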

Relevance: 100.00%

Abstract:

The Department of Forest Resource Management at the University of Helsinki carried out the SIMO project in 2004-2007 to develop a new-generation planning system for forest management. The project parties are the organisations doing most of Finnish forest planning in government, industry and privately owned forests. The aim of this study was to find out the needs and requirements for the new forest planning system and to clarify how the parties see the targets and processes in today's forest planning. Representatives responsible for forest planning in each organisation were interviewed one by one. According to the study, the stand-based system for managing and treating forests will continue in the future. Because of variable data acquisition methods with different accuracy and sources, and the development of single-tree interpretation, more and more forest data is collected without field work. The benefits of using more specific forest data also call for the use of information units smaller than the tree stand. In Finland the traditional way to arrange forest planning computation is divided into two elements. After updating the forest data to the present situation, every stand unit's growth is simulated under different alternative treatment schedules. After simulation, optimisation selects for every stand one treatment schedule so that the management program satisfies the owner's goals in the best possible way. This arrangement will be maintained in the future system. The parties' requirements to add multi-criteria problem solving, group decision support methods, and heuristic and spatial optimisation into the system make the programming work more challenging. Generally, the new system is expected to be adjustable and transparent. Strict documentation and free source code help to bring these expectations into effect. Variable growth models and treatment schedules with different source information, accuracy, methods and processing speed are expected to work easily in the system. Possibilities to calibrate models regionally and to set local parameters changing in time are also required. In the future the forest planning system will be integrated into comprehensive data management systems together with geographic, economic and work supervision information. This requires a modular method of implementing the system and the use of a simple data transmission interface between modules and with other systems. No major differences in the parties' views of the system's requirements were noticed in this study. Rather, the interviews completed the full picture from slightly different angles. In the organisations, forest management planning is considered quite inflexible and it only draws the strategic lines. It does not yet have a role in operative activity, although the need for and benefits of team-level forest planning are acknowledged. The demands and opportunities of variable forest data, new planning goals and the development of information technology are known. The party organisations want to keep on track with development. One example is the engagement in the extensive SIMO project, which connects the whole field of forest planning in Finland.
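The simulate-then-optimise division of labour can be illustrated with a toy selection step: given simulated outcomes for each stand's alternative treatment schedules, pick one schedule per stand that maximises the owner's weighted utility. All numbers, goals and weights below are invented; real systems use heuristic or exact optimisers over far richer models.

```python
# Each stand has alternative simulated treatment schedules with predicted
# outcomes; optimisation picks one schedule per stand to maximise the
# owner's weighted utility. A toy stand-in for the simulation-optimisation
# arrangement described above.
weights = {"net_income": 0.7, "biodiversity": 0.3}  # assumed owner goals

stands = {
    "stand_1": [
        {"name": "clearcut_2010", "net_income": 12_000, "biodiversity": 0.2},
        {"name": "thin_2008",     "net_income": 7_500,  "biodiversity": 0.6},
    ],
    "stand_2": [
        {"name": "no_action",     "net_income": 0,      "biodiversity": 0.9},
        {"name": "thin_2012",     "net_income": 5_000,  "biodiversity": 0.5},
    ],
}

def utility(schedule):
    # normalise income to a 0-1 scale before weighting (assumed max 12 000)
    income = schedule["net_income"] / 12_000
    return weights["net_income"] * income + weights["biodiversity"] * schedule["biodiversity"]

plan = {sid: max(alts, key=utility) for sid, alts in stands.items()}
for sid, schedule in plan.items():
    print(sid, "->", schedule["name"])
```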

Relevance: 100.00%

Abstract:

Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain the risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital when evaluating the intensive care outcome. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in the emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to a medical ICU. Two new biological markers were investigated in two separate patient populations: 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality, and its association with the degree of organ dysfunction and disease severity, was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with the outcome of intensive care. The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than from elsewhere, and the HRQoL of the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure, and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, the cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor of ICU mortality, but not of hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and with plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and the frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
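The discriminative power of a biomarker such as cell-free plasma DNA is typically quantified with the area under the ROC curve. The sketch below computes AUC via the rank-sum identity on invented values; it illustrates the kind of evaluation reported, not the study's actual data.

```python
def auc(marker, died):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a randomly chosen nonsurvivor has a higher
    marker value than a randomly chosen survivor (ties count half)."""
    pos = [m for m, d in zip(marker, died) if d]
    neg = [m for m, d in zip(marker, died) if not d]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# invented plasma DNA values (arbitrary units) and hospital outcomes
plasma_dna = [1.2, 3.4, 0.8, 5.1, 2.2, 6.0, 0.9, 4.4]
hospital_death = [0, 1, 0, 1, 0, 1, 0, 0]
print(f"AUC = {auc(plasma_dna, hospital_death):.2f}")  # 0.93 here
```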

Relevance: 100.00%

Abstract:

The superconducting (or cryogenic) gravimeter (SG) is based on the levitation of a superconducting sphere in a stable magnetic field created by current in superconducting coils. Depending on frequency, it is capable of detecting gravity variations as small as 10^-11 m s^-2. For a single event, the detection threshold is higher, conservatively about 10^-9 m s^-2. Due to its high sensitivity and low drift rate, the SG is eminently suitable for the study of geodynamical phenomena through their gravity signatures. I present investigations of Earth dynamics with the superconducting gravimeter GWR T020 at Metsähovi from 1994 to 2005. The history and key technical details of the installation are given. The data processing methods and the development of the local tidal model at Metsähovi are presented. The T020 is part of the worldwide GGP (Global Geodynamics Project) network, which consists of 20 operating stations. The data of the T020 and of the other participating SGs are available to the scientific community. The SG T020 has been used as a long-period seismometer to study microseismicity and the Earth's free oscillations. The annual variation, spectral distribution, amplitude and sources of microseism at Metsähovi are presented. Free oscillations excited by three large earthquakes were analyzed: the spectra, attenuation and rotational splitting of the modes. The lowest modes of all the different oscillation types are studied, i.e. the radial mode 0S0, the "football mode" 0S2, and the toroidal mode 0T2. The very low level (0.01 nm s^-1) incessant excitation of the Earth's free oscillations was detected with the T020. The recovery of global and regional variations in gravity with the SG requires the modelling of local gravity effects. The most important of them is hydrology. The variation in the groundwater level at Metsähovi, as measured in a borehole in the fractured bedrock, correlates significantly (0.79) with gravity. The influence of local precipitation, soil moisture and snow cover is detectable in the gravity record. The gravity effect of the variation in atmospheric mass and that of the non-tidal loading by the Baltic Sea were investigated together, as sea level and air pressure are correlated. Using Green's functions it was calculated that a 1 metre uniform layer of water in the Baltic Sea increases gravity at Metsähovi by 31 nm s^-2 and that the vertical deformation is -11 mm. The regression coefficient for sea level is 27 nm s^-2 per metre, which is 87% of the uniform model. These studies were complemented with temporal height variations from the GPS data of the Metsähovi permanent station. The long time series at Metsähovi demonstrated the high quality of the data and of the applied offset and drift corrections. The superconducting gravimeter T020 has proved to be an eminent and versatile tool in studies of Earth dynamics.
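The joint treatment of the sea-level and air-pressure effects amounts to a multiple regression of gravity residuals on both predictors at once, since the two are correlated. A synthetic sketch follows; the 27 nm s^-2 per metre admittance is from the abstract, everything else (including the pressure admittance and noise levels) is invented.

```python
import numpy as np

# Least-squares admittances of gravity residuals to Baltic sea level and
# local air pressure on synthetic data. Regressing on both predictors at
# once matters because sea level and air pressure are correlated.
rng = np.random.default_rng(0)
n = 500
sea_level = rng.normal(0.0, 0.3, n)                   # metres
pressure = 0.5 * sea_level + rng.normal(0.0, 5.0, n)  # hPa, correlated
gravity = 27.0 * sea_level - 3.0 * pressure + rng.normal(0.0, 2.0, n)  # nm s^-2

X = np.column_stack([sea_level, pressure, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, gravity, rcond=None)
print(f"sea-level admittance: {coef[0]:.1f} nm s^-2 per m")
print(f"pressure admittance:  {coef[1]:.1f} nm s^-2 per hPa")
```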

Relevance: 100.00%

Abstract:

The thesis examines urban issues arising from the transformation from state socialism to a market economy. The main topics are residential differentiation, i.e., the uneven spatial distribution of social groups across urban residential areas, and the effects of housing policy and town planning on urban development. The case study is the development of Tallinn, the capital city of Estonia, in the context of the development of Central and Eastern European cities under and after socialism. The main body of the thesis consists of four separately published refereed articles. The research question that brings the articles together is how the residential (socio-spatial) pattern of cities developed during the state socialist period, and how and why that pattern has changed since the transformation to a market economy began. The first article reviews the literature on residential differentiation in Budapest, Prague, Tallinn and Warsaw under state socialism from the viewpoint of the role of housing policy in the processes of residential differentiation at various stages of the socialist era. The paper shows how the socialist housing provision system produced socio-occupational residential differentiation directly and indirectly, and it describes how the residential patterns of these cities developed. The second article is critical of oversimplified accounts of rapid reorganisation of the overall socio-spatial pattern of post-socialist cities and of claims that residential mobility has had a straightforward role in it. The Tallinn case study, consisting of an analysis of the distribution of socio-economic groups across eight city districts and over four housing types in 1999, as well as an examination of the role of residential mobility in differentiation during the 1990s, provides contrasting evidence. The third article analyses the role and effects of housing policies in Tallinn's residential differentiation. The focus is on contemporary post-privatisation housing-policy measures and their effects. The article shows that Estonian housing policies do not even aim to reduce, prevent or slow down the harmful effects of the considerable income disparities that are manifest in housing inequality and residential differentiation. The fourth article examines the development of Tallinn's urban planning system in 1991-2004 from the viewpoint of what means it has provided the city with to intervene in urban development and how the city has used these tools. The paper finds that despite some recent progress in planning, its role in guiding where and how the city actually developed has so far been limited. Tallinn's urban development is instead initiated and driven by private agents seeking profit from their investment in land. The thesis includes original empirical research in the three articles that analyse development since socialism. The second article employs quantitative data and methods, primarily index calculation, whereas the third and fourth draw on a survey of policy documents combined with interviews with key informants. Keywords: residential differentiation, housing policy, urban planning, post-socialist transformation, Estonia, Tallinn
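The "index calculation" of the second article is not specified in the abstract; one common choice in residential differentiation research is the index of dissimilarity, sketched below with invented district counts purely for illustration.

```python
def dissimilarity_index(group_a, group_b):
    """Index of dissimilarity across city districts: the share of either
    group that would have to move for the two spatial distributions to
    match. Illustrative; the thesis's actual indices are not named here."""
    total_a, total_b = sum(group_a), sum(group_b)
    return 0.5 * sum(abs(a / total_a - b / total_b)
                     for a, b in zip(group_a, group_b))

# counts of two socio-economic groups across eight districts (invented)
high = [120, 300, 80, 450, 200, 90, 60, 150]
low  = [400, 150, 300, 100, 250, 310, 280, 120]
print(f"D = {dissimilarity_index(high, low):.2f}")  # 0 = even, 1 = fully segregated
```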

Relevance: 100.00%

Abstract:

When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journal on very incomplete information about how well the journals serve the authors' purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals, providing more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate and the service provided by the journal during the review and publication process. Data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through surveys with authors are used in the method, which has been tested on three different sets of journals, each from a different discipline. We found a number of problems with the different data acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rate). The calculation of some important parameters (for instance, average time from submission to publication, or the regional spread of authorship) can be done but requires quite a lot of work. It can be difficult to get reasonable response rates to surveys with authors. All in all, we believe that the method we propose, taking a "service to authors" perspective as a basis for benchmarking scientific journals, is useful and can provide information that is valuable to prospective authors in selected scientific disciplines.
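One of the parameters that can be calculated from web data, the average time from submission to publication, reduces to averaging date differences once the dates have been scraped from article pages. A sketch with invented dates:

```python
from datetime import date

def mean_lag_days(pairs):
    """Average submission-to-publication delay from (submitted, published)
    date pairs. A sketch of one journal parameter the method computes;
    the dates below are invented."""
    lags = [(pub - sub).days for sub, pub in pairs]
    return sum(lags) / len(lags)

articles = [
    (date(2008, 1, 15), date(2008, 9, 2)),
    (date(2008, 3, 4),  date(2009, 1, 20)),
    (date(2008, 6, 30), date(2008, 12, 11)),
]
print(f"mean submission-to-publication lag: {mean_lag_days(articles):.0f} days")
```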

Relevance: 100.00%

Abstract:

Aerosol particles affect climate, visibility, air quality and human health. However, the strength with which aerosol particles affect our everyday life is neither well described nor entirely understood. Therefore, investigations of different processes and phenomena, including e.g. primary particle sources, the initial steps of secondary particle formation and growth, the significance of charged particles in particle formation, as well as redistribution mechanisms in the atmosphere, are required. In this work the sources, sinks and concentrations of air ions (charged molecules, clusters and particles) were investigated directly by measuring air molecule ionising components (i.e. radon activity concentrations and external radiation dose rates) and charged particle size distributions, as well as through a literature review. The obtained results gave a comprehensive and valuable picture of the spatial and temporal variation of air ion sources, sinks and concentrations for use as input parameters in local- and global-scale climate models. Newly developed air ion spectrometers (Airel Ltd.) offered a possibility to investigate atmospheric (charged) particle formation and growth at sub-3 nm sizes. Therefore, new visual classification schemes for charged particle formation events were developed, and a newly developed particle growth rate method was tested on over one year of data. These data analysis methods have been widely utilised by other researchers since their introduction. This thesis revealed interesting characteristics of atmospheric particle formation and growth: e.g. particle growth may sometimes be suppressed below the detection limit (~3 nm) of traditional aerosol instruments, particle formation may take place during the daytime as well as in the evening, and growth rates of sub-3 nm particles were quite constant throughout the year, while growth rates of larger particles (3-20 nm in diameter) were higher during summer than winter. These observations were thought to be a consequence of the availability of condensing vapours. The observations of this thesis offer new understanding of particle formation in the atmosphere. However, the role of ions in particle formation, which is not well understood with current knowledge, requires further research.
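Growth rates like those compared above are commonly estimated as the slope of the growing mode diameter against time. The sketch below uses a plain linear fit on invented data points; the thesis's method, developed for charged sub-3 nm particles, is more involved.

```python
import numpy as np

def growth_rate_nm_per_h(time_h, mode_diameter_nm):
    """Particle growth rate as the slope of a straight-line fit to the
    growing mode diameter over time. A simplified illustration of
    growth-rate estimation; the data points below are invented."""
    slope, _ = np.polyfit(time_h, mode_diameter_nm, 1)
    return slope

hours = np.array([10.0, 11.0, 12.0, 13.0, 14.0])
diam = np.array([2.1, 4.8, 8.2, 11.0, 13.9])  # nm
print(f"growth rate: {growth_rate_nm_per_h(hours, diam):.1f} nm/h")
```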

Relevance: 100.00%

Abstract:

Metabolomics is a rapidly growing research field that studies the response of biological systems to environmental factors, disease states and genetic modifications. It aims at measuring the complete set of endogenous metabolites, i.e. the metabolome, in a biological sample such as plasma or cells. Because metabolites are the intermediates and end products of biochemical reactions, metabolite compositions and metabolite levels in biological samples can provide a wealth of information on ongoing processes in a living system. Due to the complexity of the metabolome, metabolomic analysis poses a challenge to analytical chemistry. Adequate sample preparation is critical to accurate and reproducible analysis, and the analytical techniques must have high resolution and sensitivity to allow detection of as many metabolites as possible. Furthermore, as the information contained in the metabolome is immense, the data set collected from metabolomic studies is very large. In order to extract the relevant information from such large data sets, efficient data processing and multivariate data analysis methods are needed. In the research presented in this thesis, metabolomics was used to study mechanisms of polymeric gene delivery to retinal pigment epithelial (RPE) cells. The aim of the study was to detect differences in metabolomic fingerprints between transfected cells and non-transfected controls, and thereafter to identify the metabolites responsible for the discrimination. The plasmid pCMV-β was introduced into RPE cells using the vector polyethyleneimine (PEI). The samples were analyzed using high performance liquid chromatography (HPLC) and ultra performance liquid chromatography (UPLC) coupled to a triple quadrupole (QqQ) mass spectrometer (MS). The software MZmine was used for raw data processing, and principal component analysis (PCA) was used in the statistical data analysis. The results revealed differences in metabolomic fingerprints between transfected cells and non-transfected controls. However, reliable fingerprinting data could not be obtained because of low analysis repeatability. Therefore, no attempts were made to identify the metabolites responsible for discrimination between sample groups. Repeatability and accuracy of analyses can be improved by protocol optimization. However, in this study, optimization of the analytical methods was hindered by the very small number of samples available for analysis. In conclusion, this study demonstrates that obtaining reliable fingerprinting data is technically demanding, and the protocols need to be thoroughly optimized in order to approach the goals of gaining information on the mechanisms of gene delivery.
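PCA-based fingerprint comparison of the kind used here projects the samples-by-peaks intensity matrix onto its leading principal components and looks for group separation in the scores. A minimal NumPy sketch on synthetic data (the study itself used MZmine and dedicated statistical software):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project a samples-by-features peak-intensity matrix onto its first
    principal components via SVD of the mean-centred data. A bare-bones
    stand-in for the PCA step of metabolomic fingerprinting."""
    Xc = X - X.mean(axis=0)                 # mean-centre each feature
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:n_components].T         # sample scores

rng = np.random.default_rng(42)
controls = rng.normal(0.0, 1.0, (5, 50))     # 5 samples x 50 features
transfected = rng.normal(0.8, 1.0, (5, 50))  # shifted metabolite levels
scores = pca_scores(np.vstack([controls, transfected]))
print(scores.round(2))  # the two groups should separate along component 1
```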